
Mining GOLD Samples for Conditional GANs

Neural Information Processing Systems

Conditional generative adversarial networks (cGANs) have gained considerable attention in recent years due to their class-wise controllability and superior quality on complex generation tasks. We introduce a simple yet effective approach to improving cGANs by measuring the discrepancy between the data distribution and the model distribution on given samples. The proposed measure, coined the gap of log-densities (GOLD), provides an effective self-diagnosis for cGANs while being efficiently computed from the discriminator. We propose three applications of the GOLD: example re-weighting, rejection sampling, and active learning, which improve the training, inference, and data selection of cGANs, respectively. Our experimental results demonstrate that the proposed methods outperform the corresponding baselines in all three applications on different image datasets.
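The abstract's "gap of log-densities" builds on a standard GAN identity: for a (near-)optimal discriminator trained with the usual logistic loss, the pre-sigmoid logit d(x) approximates log p_data(x) - log p_model(x). The sketch below illustrates that idea for the example re-weighting application only; it is a minimal illustration under that assumption, and the function names (`gold_score`, `example_weights`) and the clipping/normalization choices are my own, not the paper's.

```python
import numpy as np

def gold_score(logit):
    """Under the standard assumption that the discriminator is near-optimal,
    its logit d(x) approximates log p_data(x) - log p_model(x): a
    gap-of-log-densities style score. Positive where the model under-covers
    the data, negative where it over-generates."""
    return logit

def example_weights(logits, clip=5.0):
    """Illustrative re-weighting: up-weight real examples to which the model
    assigns too little density. Exponentiating the log-density gap gives a
    density-ratio weight; clipping keeps individual weights bounded."""
    gaps = np.clip(np.asarray(logits, dtype=float), -clip, clip)
    w = np.exp(gaps)
    return w / w.mean()  # normalize so the average weight is 1
```

Normalizing to mean 1 keeps the effective batch size unchanged, so the re-weighted loss stays on the same scale as the unweighted one.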


Reviews: Mining GOLD Samples for Conditional GANs

Neural Information Processing Systems

Clarity
- The paper is very well written and very clearly structured.
- The experimental setup is clear and the results are well explained.

Originality
- While there are several related methods that use the discriminator for estimating likelihood ratios (e.g., [1], [2], and [3]), the proposed method is specific to the conditional case and is applied in a new way to modify training and to perform active learning. The paper clearly states that the rejection sampling application is an extension of a similar approach for the unconditional case. In terms of novelty, I think the paper passes the required bar.

Quality
- The method used is sound. I think the paper does a good job of addressing the main issues that can arise with the proposed method, such as using sufficiently trained discriminators (lines 132-133).
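The "similar approach for the unconditional case" the reviewer refers to is discriminator-based rejection sampling: accept a generated sample with a probability derived from the discriminator's density-ratio estimate. Below is a minimal hedged sketch of that general idea, not the paper's exact acceptance rule; the function name `rejection_sample` and the sigmoid acceptance probability with an illustrative shift `gamma` are my own simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

def rejection_sample(samples, logits, gamma=0.0):
    """Accept each generated sample with probability sigmoid(logit - gamma).
    Since the logit estimates log p_data - log p_model, samples the model
    over-generates (negative logit) are rejected more often; gamma trades
    off acceptance rate against sample quality."""
    logits = np.asarray(logits, dtype=float)
    p_accept = 1.0 / (1.0 + np.exp(-(logits - gamma)))  # sigmoid
    keep = rng.random(len(logits)) < p_accept
    return [s for s, k in zip(samples, keep) if k]
```

Raising `gamma` makes the filter stricter (fewer samples kept, drawn from regions where the estimated density ratio is higher), which is the knob such schemes typically expose.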



Mining GOLD Samples for Conditional GANs

Mo, Sangwoo, Kim, Chiheon, Kim, Sungwoong, Cho, Minsu, Shin, Jinwoo

Neural Information Processing Systems
